CMSC320 Final Project
Bingying Jiang
It's been almost two years since the outbreak of the COVID-19 pandemic. At its root, the COVID-19 crisis is a global health crisis that has affected and changed our lives; it is not a financial or economic crisis. However, because of its huge effects on supply and demand, it inevitably turned into a large-scale economic crisis as well. In this project, I explore the correlation between COVID-19 cases and deaths and FAANG stock prices between January 2020 and December 2021 using machine learning models. Although the stock market's volatility can be affected by many factors, the pandemic, as a barometer for the path of the crisis, plays a vital role. In one article, Elaine Loh concluded that airline stock prices were especially vulnerable to pandemics, since people tend to travel less during such periods. On the face of it, technology companies shouldn't be affected much, because they have no such direct exposure. In this project, I think it would be fun to find out whether there's a correlation between COVID-19's daily cases and deaths and FAANG stock prices.
Plotly is the main visualization tool in this project. It can produce professional, interactive figures in just a few lines of code, and I personally think it works better than seaborn: it allows more flexibility and customization. You might want to check the Plotly website for more ways to create fun figures. Besides pandas and numpy, the keras library is also used for the machine learning work. You probably noticed that I also installed tensorflow, which takes a while to download if you're running this on a virtual machine. It's a powerful machine learning tool, but unfortunately it's not used directly in this project; installing TensorFlow was just a way to fix keras's installation problem.
pip install plotly
pip install tensorflow
from tensorflow.keras.layers import Input, Dense, LSTM
from tensorflow.keras.models import Sequential
from tensorflow.keras.utils import plot_model
import requests
from bs4 import BeautifulSoup as bs
import pandas as pd
import datetime as dt
from datetime import datetime
import numpy as np
import matplotlib.pyplot as plt
from sklearn.model_selection import cross_validate
import plotly.graph_objects as go
import plotly.express as px
import plotly.offline as py
from functools import reduce
from plotly.offline import iplot
from statsmodels.formula.api import ols
from sklearn.feature_selection import f_regression
import statsmodels.api as sm
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split, cross_val_predict, TimeSeriesSplit, KFold, cross_val_score
The first step of data analysis is to choose and obtain a dataset. In this case, I need up-to-date datasets of both COVID-19 and FAANG stock prices. I obtained daily-updated worldwide COVID-19 data from Our World in Data's GitHub site. Since it's a huge dataset covering the whole world, I have to clean it before analysis. Nasdaq's website has nearly perfect data for the stock market: this dataset includes the open, close, high, low, and volume of FAANG stocks over the past five years. Unfortunately, it doesn't include the adjusted close price. The adjusted closing price is the price of the stock after accounting for dividends; compared to the raw close price in this dataset, it tends to give a better idea of the overall value of the stock.
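To illustrate what an adjusted close is, here is a minimal sketch of the standard back-adjustment, using made-up prices and a hypothetical $2 dividend: on each ex-dividend day, all earlier prices are scaled by (1 - dividend / previous close).

```python
import pandas as pd

# Hypothetical closes with one $2 dividend going ex on day 3 (index 3).
prices = pd.Series([100.0, 102.0, 101.0, 99.5, 100.5])
dividends = pd.Series([0.0, 0.0, 0.0, 2.0, 0.0])

# Per-day adjustment factor: 1 - dividend / previous close (1.0 elsewhere).
factors = (1 - dividends / prices.shift(1)).fillna(1.0)

# Each day's adjusted close applies the factors of all *future* dividends,
# so prices before an ex-date are scaled down and later prices are untouched.
adj_factor = factors[::-1].cumprod()[::-1].shift(-1).fillna(1.0)
adj_close = prices * adj_factor
```

Here the three closes before the ex-date are multiplied by 1 - 2/101 ≈ 0.98, while the last two are unchanged, which is why an adjusted series reads lower in the past than the raw closes.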
# read covid and FAANG stock datasets
covid_data = pd.read_csv("owid-covid-data.csv")
apple = pd.read_csv("Apple_5y.csv")
amazon = pd.read_csv("Amazon_5y.csv")
facebook = pd.read_csv("Facebook_5y.csv")
netflix = pd.read_csv("Netflix_5y.csv")
google = pd.read_csv("GOOGL_5y.csv")
The COVID dataset contains worldwide case information. In this project, I only intend to analyze the impact of US COVID cases and deaths on US stocks. I restricted every dataset to the period from January 2020 to December 2021 to keep the date ranges and formats consistent. I then standardized the chosen columns (using their means and standard deviations) in preparation for the analysis.
# Extract us covid data from large world data
us_data = covid_data.loc[covid_data['location'] == 'United States']
# super large world covid 19 tracking data
covid = us_data.filter(['date','total_cases','new_cases','total_deaths','new_deaths'], axis=1)
covid.head()
| | date | total_cases | new_cases | total_deaths | new_deaths |
|---|---|---|---|---|---|
| 133266 | 2020-01-22 | 1.0 | NaN | NaN | NaN |
| 133267 | 2020-01-23 | 1.0 | 0.0 | NaN | NaN |
| 133268 | 2020-01-24 | 2.0 | 1.0 | NaN | NaN |
| 133269 | 2020-01-25 | 2.0 | 0.0 | NaN | NaN |
| 133270 | 2020-01-26 | 5.0 | 3.0 | NaN | NaN |
After checking the covid dataframe, I noticed some NaNs that needed to be removed or replaced. Replacing them with zero is the better option: it keeps the columns ready for the calculations in the next few steps and avoids losing data unnecessarily. I also found that the messy index needed to be fixed. Because dates can't be used directly in calculations, I added a new column called Day to count the number of days. Finally, it doesn't feel proper to report case counts as floats, so I converted them to integers.
#replace nan with zero
covid = covid.fillna(0)
# change date to datetime
covid['date'] = pd.to_datetime(covid["date"])
# reset to a clean index
covid = covid.reset_index(drop=True)
# Rename
covid.rename({'date': 'Date'}, axis = 1, inplace = True)
# convert counts to integers
covid = covid.astype({"new_cases": int, 'total_cases': int, 'total_deaths': int, 'new_deaths': int})
# add a Day counter starting at 1
covid['Day'] = np.arange(len(covid)) + 1
covid.head()
| | Date | total_cases | new_cases | total_deaths | new_deaths | Day |
|---|---|---|---|---|---|---|
| 0 | 2020-01-22 | 1 | 0 | 0 | 0 | 1 |
| 1 | 2020-01-23 | 1 | 0 | 0 | 0 | 2 |
| 2 | 2020-01-24 | 2 | 1 | 0 | 0 | 3 |
| 3 | 2020-01-25 | 2 | 0 | 0 | 0 | 4 |
| 4 | 2020-01-26 | 5 | 3 | 0 | 0 | 5 |
The dataframe above is now basically clean and tidy. Next, I standardize the daily new cases and new deaths by subtracting each column's mean and dividing by its standard deviation, computing standardized values for all columns at once. To standardize a dataset means to scale its values so that the mean is 0 and the standard deviation is 1. More information about standardizing could be found here.
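As a quick self-contained check of that definition, standardizing any toy series this way should yield a mean of 0 and a standard deviation of 1:

```python
import pandas as pd

# toy series standing in for a covid column
s = pd.Series([1.0, 2.0, 3.0, 4.0, 5.0])

# z-score: subtract the mean, divide by the (sample) standard deviation
z = (s - s.mean()) / s.std()
```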
data = covid[['new_cases', 'new_deaths', 'total_cases', 'total_deaths']]
covid[['standard_case', 'standard_death', 'std_tot_cases', 'std_tot_deaths']] = (data-data.mean())/data.std()
covid.head()
| | Date | total_cases | new_cases | total_deaths | new_deaths | Day | standard_case | standard_death | std_tot_cases | std_tot_deaths |
|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2020-01-22 | 1 | 0 | 0 | 0 | 1 | -1.104002 | -1.204305 | -1.229448 | -1.45181 |
| 1 | 2020-01-23 | 1 | 0 | 0 | 0 | 2 | -1.104002 | -1.204305 | -1.229448 | -1.45181 |
| 2 | 2020-01-24 | 2 | 1 | 0 | 0 | 3 | -1.103986 | -1.204305 | -1.229448 | -1.45181 |
| 3 | 2020-01-25 | 2 | 0 | 0 | 0 | 4 | -1.104002 | -1.204305 | -1.229448 | -1.45181 |
| 4 | 2020-01-26 | 5 | 3 | 0 | 0 | 5 | -1.103956 | -1.204305 | -1.229448 | -1.45181 |
FAANG is an acronym for some of the most prominent companies in the tech sector: Facebook (NASDAQ: FB), Apple (NASDAQ: AAPL), Amazon (NASDAQ: AMZN), Netflix (NASDAQ: NFLX), and Alphabet (NASDAQ: GOOG, GOOGL), formerly Google. Originally the acronym was FANG; Apple was added later.
Since FAANG stands for five tech companies, I made a list of their dataframes to reduce repeated code. With pandas, I have to remove the dollar sign and convert the close, open, high, and low columns to float, because type str doesn't support the daily abnormal stock price calculation. I define the daily abnormal FAANG stock price between January 22, 2020 and December 16, 2021 by subtracting the average price of the last twenty-three months from the daily price and dividing the difference by the standard deviation of the last twenty-three months.
# make a list a dataframes.
companies = [apple, amazon, facebook, netflix, google]
# convert to datetime type
for c in companies:
    c['Date'] = pd.to_datetime(c["Date"])
# remove dollar sign and convert to float
for c in companies:
    c['Close/Last'] = c['Close/Last'].astype(str).str.replace('$', '', regex=False).astype(float)
    c['Open'] = c['Open'].astype(str).str.replace('$', '', regex=False).astype(float)
    c['High'] = c['High'].astype(str).str.replace('$', '', regex=False).astype(float)
    c['Low'] = c['Low'].astype(str).str.replace('$', '', regex=False).astype(float)
# get abnormal index
for c in companies:
    c['abnormal'] = (c['Close/Last'] - c['Close/Last'].mean())/c['Close/Last'].std()
    c['V'] = c['Volume'].mean()/20000
    c['st_open'] = (c['Open'] - c['Open'].mean())/c['Open'].std()
c['st_open'] = (c['Open'] - c['Open'].mean())/c['Open'].std()
# filter date
amazon = amazon[amazon["Date"] >= '2020-01-22']
apple = apple[apple["Date"] >= '2020-01-22']
facebook = facebook[facebook["Date"] >= '2020-01-22']
google = google[google["Date"] >= '2020-01-22']
netflix = netflix[netflix["Date"] >= '2020-01-22']
# make index neat
amazon = amazon.sort_values(by = 'Date').reset_index(drop=True)
apple = apple.sort_values(by = 'Date').reset_index(drop=True)
google = google.sort_values(by = 'Date').reset_index(drop=True)
netflix = netflix.sort_values(by = 'Date').reset_index(drop=True)
facebook = facebook.sort_values(by = 'Date').reset_index(drop=True)
comp = ['Facebook', 'Apple', 'Amazon', 'Netflix', 'Google']
amazon.head()
| | Date | Close/Last | Volume | Open | High | Low | abnormal | V | st_open |
|---|---|---|---|---|---|---|---|---|---|
| 0 | 2020-01-22 | 1887.46 | 3216257 | 1896.09 | 1902.500 | 1883.34 | -0.204168 | 213.115061 | -0.195034 |
| 1 | 2020-01-23 | 1884.58 | 2484613 | 1885.11 | 1889.975 | 1872.76 | -0.207449 | 213.115061 | -0.207527 |
| 2 | 2020-01-24 | 1861.64 | 3766181 | 1891.37 | 1894.990 | 1847.44 | -0.233584 | 213.115061 | -0.200405 |
| 3 | 2020-01-27 | 1828.34 | 3528509 | 1820.00 | 1841.000 | 1815.34 | -0.271521 | 213.115061 | -0.281604 |
| 4 | 2020-01-28 | 1853.25 | 2808040 | 1840.50 | 1858.110 | 1830.02 | -0.243142 | 213.115061 | -0.258281 |
fig=go.Figure()
fig.add_trace(go.Scatter(x= covid.Date, y=covid["std_tot_cases"],mode='lines',name='Total Cases'))
fig.add_trace(go.Scatter(x= covid.Date, y=covid["std_tot_deaths"],mode='lines',name='Total Deaths'))
fig.update_layout(title="Covid-19 Daily Total Cases and Deaths Tracking in US", xaxis_title="Since 2020-01",yaxis_title="Number of Cases",legend=dict(x=0,y=1,traceorder="normal"))
fig.show()
From the plot above, we can see that growth is almost exponential; there is a positive relationship between total cases and total deaths over time, and there is no sign of flattening.
fig=go.Figure()
fig.add_trace(go.Scatter(x= covid.Date, y=covid["standard_case"],mode='lines',name='New Case'))
fig.add_trace(go.Scatter(x= covid.Date, y=covid["standard_death"],mode='lines',name='New Death'))
fig.update_layout(title="Covid-19 Daily Death vs. Case in US",xaxis_title="Since 2020-01",yaxis_title="Standard indicator",legend=dict(x=0,y=1,traceorder="normal"))
fig.show()
The plot above shows the distributions of both daily deaths and daily cases. They are almost identical, except during the period between May 2020 and June 2020, when daily deaths reached a peak but daily cases didn't. As a result, the distribution of daily deaths is multimodal and the distribution of daily cases is bimodal; both are roughly symmetric around the center.
In this section I made interactive graphs of the FAANG companies' prices. Close prices are very important to investors, who often decide whether to buy or sell a stock based on them, and the standardized ("abnormal") prices plotted next make the five companies directly comparable despite their very different price levels.
The graphs in this section are all interactive and visual. Each figure has a dropdown menu so a reader can choose which company's graph to study based on their preference, along with an "All" option to overlay every company at once.
fig = go.Figure()
fig.add_trace(go.Scatter(x=facebook.Date, y=facebook['Close/Last'], name='FB'))
fig.add_trace(go.Scatter(x=apple.Date, y=apple['Close/Last'], name='AAPL'))
fig.add_trace(go.Scatter(x=amazon.Date, y=amazon['Close/Last'], name='AMZN'))
fig.add_trace(go.Scatter(x=netflix.Date, y=netflix['Close/Last'], name='NFLX'))
fig.add_trace(go.Scatter(x=google.Date, y=google['Close/Last'], name='GOOG'))
fig.update_layout(title='Close Prices for All Companies from Jan 2020 to Dec 2021',
xaxis_title='Date',
yaxis_title='Close Price')
fig.update_layout(
updatemenus=[
dict(
buttons=list([
dict(label = 'All',
method = 'update',
args = [{'visible': [True, True, True, True, True]},
{'title': 'All',
'showlegend':True}]),
dict(label = 'Facebook',
method = 'update',
args = [{'visible': [True, False, False, False, False]},
{'title': 'FB',
'showlegend':True}]),
dict(label = 'Apple',
method = 'update',
args = [{'visible': [False, True, False, False, False]},
{'title': 'AAPL',
'showlegend':True}]),
dict(label = 'Amazon',
method = 'update',
args = [{'visible': [False, False, True, False, False]},
{'title': 'AMZN',
'showlegend':True}]),
dict(label = 'Netflix',
method = 'update',
args = [{'visible': [False, False, False, True, False]},
{'title': 'NFLX',
'showlegend':True}]),
dict(label = 'Google',
method = 'update',
args = [{'visible': [False, False, False, False, True]},
{'title': 'GOOG',
'showlegend':True}]),
]),
direction="down",
pad={"r": 10, "t": 10},
showactive=True,
x=0.1,
xanchor="left",
y=1.1,
yanchor="top"
),
]
)
fig.update_layout(autosize=False,
width=1000,
height=650,)
iplot(fig,show_link=False)
The plot above shows the close price of every FAANG company. We can tell that both Amazon and Google have much higher close prices than the other three; Amazon has the highest close price, and Apple has the lowest.
fig = go.Figure()
fig.add_trace(go.Scatter(x=facebook.Date, y=facebook['abnormal'], name='FB'))
fig.add_trace(go.Scatter(x=apple.Date, y=apple['abnormal'], name='AAPL'))
fig.add_trace(go.Scatter(x=amazon.Date, y=amazon['abnormal'], name='AMZN'))
fig.add_trace(go.Scatter(x=netflix.Date, y=netflix['abnormal'], name='NFLX'))
fig.add_trace(go.Scatter(x=google.Date, y=google['abnormal'], name='GOOG'))
fig.update_layout(title='Abnormalities of Close prices for FAANG from Jan 2020 to Dec 2021',xaxis_title='Date',yaxis_title='Close Price')
fig.update_layout(
updatemenus=[
dict(
buttons=list([
dict(label = 'All',method = 'update',
args = [{'visible': [True, True, True, True, True]},{'title': 'All','showlegend':True}]),
dict(label = 'Facebook',method = 'update',
args = [{'visible': [True, False, False, False, False]},{'title': 'FB', 'showlegend':True}]),
dict(label = 'Apple',method = 'update',
args = [{'visible': [False, True, False, False, False]},{'title': 'AAPL','showlegend':True}]),
dict(label = 'Amazon', method = 'update',
args = [{'visible': [False, False, True, False, False]},{'title': 'AMZN','showlegend':True}]),
dict(label = 'Netflix',method = 'update',
args = [{'visible': [False, False, False, True, False]},{'title': 'NFLX','showlegend':True}]),
dict(label = 'Google',method = 'update',
args = [{'visible': [False, False, False, False, True]},{'title': 'GOOG','showlegend':True}]),]),
direction="down",
pad={"r": 10, "t": 10},showactive=True,x=0.1,xanchor="left",y=1.1,yanchor="top"),])
fig.update_layout(
autosize=False,
width=1000,
height=650,)
iplot(fig,show_link=False)
The plot shows that the FAANG companies have similar abnormalities, which suggests their stock prices are affected in similar ways.
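One way to make "similar" precise is the Pearson correlation between standardized series. A minimal sketch, with synthetic series standing in for the 'abnormal' columns computed above (the company names here are only illustrative labels):

```python
import numpy as np
import pandas as pd

# Two synthetic standardized price series sharing a common upward trend,
# plus independent noise — a stand-in for the 'abnormal' columns.
rng = np.random.default_rng(0)
trend = np.linspace(-1, 1, 100)
df = pd.DataFrame({
    'AAPL': trend + rng.normal(0, 0.1, 100),
    'AMZN': trend + rng.normal(0, 0.1, 100),
})

# Pairwise Pearson correlation matrix; values near 1 mean the two
# series move together, supporting the "affected in the same way" reading.
corr = df.corr()
```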
# Calculate volatility
v = []
for c in companies:
    c['Log returns'] = np.log(c['Close/Last']/c['Close/Last'].shift())
    volatility = c['Log returns'].std()*252**.5
    v.append(volatility)
print(v)
[0.3047438702157555, 0.2936558186461037, 0.3337439408977602, 0.3805611213043614, 0.2707517904683592]
Volatility is defined as how much variation there is in the price of a given stock or index of stocks; simply put, how widely a price can swing up or down. It is generally considered a measure of the level of risk in an investment. Typically, low volatility is associated with positive market returns and high volatility with negative market returns, although volatility can be high whether stocks are rising or falling in value. Average volatility is often within a range of 10-20%.
The FAANG companies have fairly high volatility: the highest is about 38% for Netflix, and the lowest is Google's, at about 27%. Even the lowest is well above the typical range, so we can expect that FAANG stock prices might fall in the future; it's risky to invest at this moment.
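The annualized-volatility formula used above can be sketched in isolation with made-up closing prices (252 is the usual number of trading days per year):

```python
import numpy as np
import pandas as pd

# Made-up daily closes for illustration only.
close = pd.Series([100.0, 101.0, 99.0, 102.0, 103.0, 101.5])

# Daily log returns: ln(today's close / yesterday's close);
# the first entry is NaN because there is no prior close.
log_returns = np.log(close / close.shift())

# Annualize the daily standard deviation by sqrt(252).
annualized_vol = log_returns.std() * 252 ** 0.5
```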
AAPL = apple[["Date","abnormal", 'Close/Last']]
FB = facebook[["Date","abnormal", 'Close/Last']]
AMZN = amazon[["Date","abnormal", 'Close/Last']]
NFLX = netflix[["Date","abnormal", 'Close/Last']]
GOOG = google[["Date","abnormal", 'Close/Last']]
c_data = covid[['Date','standard_case','standard_death', 'new_cases', 'new_deaths']]
Apple = pd.merge(AAPL, c_data, on='Date', how='inner')
Facebook = pd.merge(FB, c_data, on='Date', how='inner')
Amazon = pd.merge(AMZN, c_data, on='Date', how='inner')
Netflix = pd.merge(NFLX, c_data, on='Date', how='inner')
Google = pd.merge(GOOG, c_data, on='Date', how='inner')
new = [Facebook,Apple,Amazon,Netflix,Google]
for i, j in zip(new, comp):
    X = np.array(i['new_deaths']).reshape(-1,1)
    y = np.array(i['Close/Last'])
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LinearRegression()
    model.fit(X_train, y_train)
    x_range = np.linspace(X.min(), X.max(), 100)
    y_range = model.predict(x_range.reshape(-1, 1))
    fig = go.Figure()
    fig.add_trace(go.Scatter(x=X_train.squeeze(), y=y_train, name='Training Data', mode='markers'))
    fig.add_trace(go.Scatter(x=X_test.squeeze(), y=y_test, name='Testing Data', mode='markers'))
    fig.add_trace(go.Scatter(x=x_range, y=y_range, name='Linear Regression'))
    model = KNeighborsRegressor()
    model.fit(X_train, y_train)
    y_range = model.predict(x_range.reshape(-1, 1))
    fig.add_trace(go.Scatter(x=x_range, y=y_range, name='kNN Regressor'))
    model = DecisionTreeRegressor()
    model.fit(X_train, y_train)
    y_range = model.predict(x_range.reshape(-1, 1))
    fig.add_trace(go.Scatter(x=x_range, y=y_range, name='Decision Tree', marker_color='gold'))
    fig.update_layout(
        updatemenus=[
            dict(
                buttons=list([
                    dict(label = 'All',method = 'update',
                        args = [{'visible': [True, True, True, True, True]},{'title': 'All','showlegend':True}]),
                    dict(label = 'Decision Tree Regressor',method = 'update',
                        args = [{'visible': [True, True, False, False, True]},{'title': 'Decision Tree Regressor for '+ j,'showlegend':True}]),
                    dict(label = 'Linear Regression',method = 'update',
                        args = [{'visible': [True, True, True, False, False]},{'title': 'Linear Regression for '+ j,'showlegend':True}]),
                    dict(label = 'k-NN Regressor',method = 'update',
                        args = [{'visible': [True, True, False, True, False]},{'title': 'k-NN Regressor for '+ j,'showlegend':True}]),]),
                direction="down",
                pad={"r": 10, "t": 10},showactive=True,x=0.1,xanchor="left",y=1.1,yanchor="top"),])
    fig.update_layout(title='Regression Line Fit for Daily New Covid-19 Deaths vs. Daily Stock Close Price for ' + j, xaxis_title='Daily New Deaths', yaxis_title='Close Price',autosize=False, width=1000,height=650)
    iplot(fig,show_link=False)
    report = ols(formula = "y ~ X", data = i).fit()
    print(report.summary())
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.203
Model: OLS Adj. R-squared: 0.201
Method: Least Squares F-statistic: 122.0
Date: Tue, 21 Dec 2021 Prob (F-statistic): 2.00e-25
Time: 04:22:28 Log-Likelihood: -2574.5
No. Observations: 482 AIC: 5153.
Df Residuals: 480 BIC: 5161.
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [0.025 0.975]
------------------------------------------------------------------------------
Intercept 248.2639 3.556 69.810 0.000 241.276 255.252
X 0.0004 3.36e-05 11.045 0.000 0.000 0.000
==============================================================================
Omnibus: 55.306 Durbin-Watson: 0.045
Prob(Omnibus): 0.000 Jarque-Bera (JB): 15.735
Skew: 0.010 Prob(JB): 0.000383
Kurtosis: 2.115 Cond. No. 1.63e+05
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The condition number is large, 1.63e+05. This might indicate that there are
strong multicollinearity or other numerical problems.
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.338
Model: OLS Adj. R-squared: 0.336
Method: Least Squares F-statistic: 244.8
Date: Tue, 21 Dec 2021 Prob (F-statistic): 6.89e-45
Time: 04:22:28 Log-Likelihood: -2192.9
No. Observations: 482 AIC: 4390.
Df Residuals: 480 BIC: 4398.
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [0.025 0.975]
------------------------------------------------------------------------------
Intercept 98.8336 1.611 61.342 0.000 95.668 101.999
X 0.0002 1.52e-05 15.647 0.000 0.000 0.000
==============================================================================
Omnibus: 151.781 Durbin-Watson: 0.079
Prob(Omnibus): 0.000 Jarque-Bera (JB): 24.056
Skew: -0.060 Prob(JB): 5.97e-06
Kurtosis: 1.912 Cond. No. 1.63e+05
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The condition number is large, 1.63e+05. This might indicate that there are
strong multicollinearity or other numerical problems.
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.233
Model: OLS Adj. R-squared: 0.232
Method: Least Squares F-statistic: 146.0
Date: Tue, 21 Dec 2021 Prob (F-statistic): 1.59e-29
Time: 04:22:28 Log-Likelihood: -3612.0
No. Observations: 482 AIC: 7228.
Df Residuals: 480 BIC: 7236.
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [0.025 0.975]
------------------------------------------------------------------------------
Intercept 2752.6984 30.609 89.931 0.000 2692.554 2812.842
X 0.0035 0.000 12.081 0.000 0.003 0.004
==============================================================================
Omnibus: 25.741 Durbin-Watson: 0.062
Prob(Omnibus): 0.000 Jarque-Bera (JB): 24.992
Skew: -0.508 Prob(JB): 3.74e-06
Kurtosis: 2.540 Cond. No. 1.63e+05
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The condition number is large, 1.63e+05. This might indicate that there are
strong multicollinearity or other numerical problems.
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.266
Model: OLS Adj. R-squared: 0.264
Method: Least Squares F-statistic: 173.9
Date: Tue, 21 Dec 2021 Prob (F-statistic): 4.15e-34
Time: 04:22:28 Log-Likelihood: -2711.9
No. Observations: 482 AIC: 5428.
Df Residuals: 480 BIC: 5436.
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [0.025 0.975]
------------------------------------------------------------------------------
Intercept 457.2743 4.730 96.679 0.000 447.981 466.568
X 0.0006 4.47e-05 13.188 0.000 0.001 0.001
==============================================================================
Omnibus: 6.268 Durbin-Watson: 0.077
Prob(Omnibus): 0.044 Jarque-Bera (JB): 6.303
Skew: 0.280 Prob(JB): 0.0428
Kurtosis: 2.985 Cond. No. 1.63e+05
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The condition number is large, 1.63e+05. This might indicate that there are
strong multicollinearity or other numerical problems.
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.172
Model: OLS Adj. R-squared: 0.170
Method: Least Squares F-statistic: 99.48
Date: Tue, 21 Dec 2021 Prob (F-statistic): 2.04e-21
Time: 04:22:28 Log-Likelihood: -3697.0
No. Observations: 482 AIC: 7398.
Df Residuals: 480 BIC: 7406.
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [0.025 0.975]
------------------------------------------------------------------------------
Intercept 1699.4049 36.508 46.549 0.000 1627.670 1771.140
X 0.0034 0.000 9.974 0.000 0.003 0.004
==============================================================================
Omnibus: 1581.892 Durbin-Watson: 0.031
Prob(Omnibus): 0.000 Jarque-Bera (JB): 44.516
Skew: 0.312 Prob(JB): 2.16e-10
Kurtosis: 1.648 Cond. No. 1.63e+05
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The condition number is large, 1.63e+05. This might indicate that there are
strong multicollinearity or other numerical problems.
Hypothesis Testing to Check for a Relationship Between Daily Deaths and Close Price for FAANG:
A hypothesis test is conducted at the 95% confidence level to see whether there is a relationship between daily deaths and the close price for FAANG.
Null Hypothesis: There is no relationship between daily deaths and the close price for FAANG. Alternative Hypothesis: There is a relationship between daily deaths and the close price for FAANG.
If the p-value is greater than 0.05, we fail to reject the null hypothesis. If the p-value is smaller than 0.05, we reject the null hypothesis in favor of the alternative.
The p-value we get is 0.000, which is smaller than 0.05, so we reject the null hypothesis in favor of the alternative.
Since the null hypothesis is rejected, we can conclude that there is a relationship between daily deaths and the close price of FAANG.
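The decision rule above can also be read off the fitted statsmodels model programmatically instead of from the printed summary. A minimal sketch, using synthetic stand-in data rather than the project's actual frames:

```python
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols

# Hypothetical data standing in for one company's frame:
# daily deaths (X) vs. daily close price (y)
rng = np.random.default_rng(0)
X = rng.uniform(0, 4000, 482)
df = pd.DataFrame({"X": X, "y": 100 + 0.02 * X + rng.normal(0, 5, 482)})

report = ols(formula="y ~ X", data=df).fit()
slope_p = report.pvalues["X"]  # p-value for the slope coefficient

# Decision rule at the 95% confidence level
if slope_p < 0.05:
    decision = "reject the null hypothesis"
else:
    decision = "fail to reject the null hypothesis"
print(f"slope p-value = {slope_p:.3g} -> {decision}")
```

The same `report.pvalues` lookup works on each `report` produced inside the regression loop, so the test could be repeated for all five companies without reading the summaries by eye.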
for i, j in zip(new, comp):
    X = np.array(i['new_cases']).reshape(-1, 1)
    y = np.array(i['Close/Last'])
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Linear Regression fit
    model = LinearRegression()
    model.fit(X_train, y_train)
    x_range = np.linspace(X.min(), X.max(), 100)
    y_range = model.predict(x_range.reshape(-1, 1))

    fig = go.Figure()
    fig.add_trace(go.Scatter(x=X_train.squeeze(), y=y_train, name='Training Data', mode='markers'))
    fig.add_trace(go.Scatter(x=X_test.squeeze(), y=y_test, name='Testing Data', mode='markers'))
    fig.add_trace(go.Scatter(x=x_range, y=y_range, name='Linear Regression'))

    # k-NN Regressor fit
    model = KNeighborsRegressor()
    model.fit(X_train, y_train)
    y_range = model.predict(x_range.reshape(-1, 1))
    fig.add_trace(go.Scatter(x=x_range, y=y_range, name='kNN Regressor'))

    # Decision Tree Regressor fit
    model = DecisionTreeRegressor()
    model.fit(X_train, y_train)
    y_range = model.predict(x_range.reshape(-1, 1))
    fig.add_trace(go.Scatter(x=x_range, y=y_range, name='Decision Tree', marker_color='gold'))

    # Dropdown menu to toggle among the three regressors
    fig.update_layout(
        updatemenus=[
            dict(
                buttons=list([
                    dict(label='All', method='update',
                         args=[{'visible': [True, True, True, True, True]}, {'title': 'All', 'showlegend': True}]),
                    dict(label='Decision Tree Regressor', method='update',
                         args=[{'visible': [True, True, False, False, True]}, {'title': 'Decision Tree Regressor for ' + j, 'showlegend': True}]),
                    dict(label='Linear Regression', method='update',
                         args=[{'visible': [True, True, True, False, False]}, {'title': 'Linear Regression for ' + j, 'showlegend': True}]),
                    dict(label='k-NN Regressor', method='update',
                         args=[{'visible': [True, True, False, True, False]}, {'title': 'k-NN Regressor for ' + j, 'showlegend': True}]),
                ]),
                direction="down",
                pad={"r": 10, "t": 10}, showactive=True, x=0.1, xanchor="left", y=1.1, yanchor="top"),
        ])
    fig.update_layout(title='Regression Line Fit for Daily New Covid-19 Cases vs. Daily Stock Close Price for ' + j,
                      xaxis_title='Number of Cases', yaxis_title='Close Price',
                      autosize=False, width=1000, height=650)
    iplot(fig, show_link=False)

    # OLS summary for the linear fit
    report = ols(formula="y ~ X", data=i).fit()
    print(report.summary())
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.203
Model: OLS Adj. R-squared: 0.201
Method: Least Squares F-statistic: 122.0
Date: Tue, 21 Dec 2021 Prob (F-statistic): 2.00e-25
Time: 04:34:21 Log-Likelihood: -2574.5
No. Observations: 482 AIC: 5153.
Df Residuals: 480 BIC: 5161.
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [0.025 0.975]
------------------------------------------------------------------------------
Intercept 248.2639 3.556 69.810 0.000 241.276 255.252
X 0.0004 3.36e-05 11.045 0.000 0.000 0.000
==============================================================================
Omnibus: 55.306 Durbin-Watson: 0.045
Prob(Omnibus): 0.000 Jarque-Bera (JB): 15.735
Skew: 0.010 Prob(JB): 0.000383
Kurtosis: 2.115 Cond. No. 1.63e+05
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The condition number is large, 1.63e+05. This might indicate that there are
strong multicollinearity or other numerical problems.
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.338
Model: OLS Adj. R-squared: 0.336
Method: Least Squares F-statistic: 244.8
Date: Tue, 21 Dec 2021 Prob (F-statistic): 6.89e-45
Time: 04:34:21 Log-Likelihood: -2192.9
No. Observations: 482 AIC: 4390.
Df Residuals: 480 BIC: 4398.
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [0.025 0.975]
------------------------------------------------------------------------------
Intercept 98.8336 1.611 61.342 0.000 95.668 101.999
X 0.0002 1.52e-05 15.647 0.000 0.000 0.000
==============================================================================
Omnibus: 151.781 Durbin-Watson: 0.079
Prob(Omnibus): 0.000 Jarque-Bera (JB): 24.056
Skew: -0.060 Prob(JB): 5.97e-06
Kurtosis: 1.912 Cond. No. 1.63e+05
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The condition number is large, 1.63e+05. This might indicate that there are
strong multicollinearity or other numerical problems.
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.233
Model: OLS Adj. R-squared: 0.232
Method: Least Squares F-statistic: 146.0
Date: Tue, 21 Dec 2021 Prob (F-statistic): 1.59e-29
Time: 04:34:21 Log-Likelihood: -3612.0
No. Observations: 482 AIC: 7228.
Df Residuals: 480 BIC: 7236.
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [0.025 0.975]
------------------------------------------------------------------------------
Intercept 2752.6984 30.609 89.931 0.000 2692.554 2812.842
X 0.0035 0.000 12.081 0.000 0.003 0.004
==============================================================================
Omnibus: 25.741 Durbin-Watson: 0.062
Prob(Omnibus): 0.000 Jarque-Bera (JB): 24.992
Skew: -0.508 Prob(JB): 3.74e-06
Kurtosis: 2.540 Cond. No. 1.63e+05
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The condition number is large, 1.63e+05. This might indicate that there are
strong multicollinearity or other numerical problems.
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.266
Model: OLS Adj. R-squared: 0.264
Method: Least Squares F-statistic: 173.9
Date: Tue, 21 Dec 2021 Prob (F-statistic): 4.15e-34
Time: 04:34:21 Log-Likelihood: -2711.9
No. Observations: 482 AIC: 5428.
Df Residuals: 480 BIC: 5436.
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [0.025 0.975]
------------------------------------------------------------------------------
Intercept 457.2743 4.730 96.679 0.000 447.981 466.568
X 0.0006 4.47e-05 13.188 0.000 0.001 0.001
==============================================================================
Omnibus: 6.268 Durbin-Watson: 0.077
Prob(Omnibus): 0.044 Jarque-Bera (JB): 6.303
Skew: 0.280 Prob(JB): 0.0428
Kurtosis: 2.985 Cond. No. 1.63e+05
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The condition number is large, 1.63e+05. This might indicate that there are
strong multicollinearity or other numerical problems.
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.172
Model: OLS Adj. R-squared: 0.170
Method: Least Squares F-statistic: 99.48
Date: Tue, 21 Dec 2021 Prob (F-statistic): 2.04e-21
Time: 04:34:21 Log-Likelihood: -3697.0
No. Observations: 482 AIC: 7398.
Df Residuals: 480 BIC: 7406.
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [0.025 0.975]
------------------------------------------------------------------------------
Intercept 1699.4049 36.508 46.549 0.000 1627.670 1771.140
X 0.0034 0.000 9.974 0.000 0.003 0.004
==============================================================================
Omnibus: 1581.892 Durbin-Watson: 0.031
Prob(Omnibus): 0.000 Jarque-Bera (JB): 44.516
Skew: 0.312 Prob(JB): 2.16e-10
Kurtosis: 1.648 Cond. No. 1.63e+05
==============================================================================
Notes:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The condition number is large, 1.63e+05. This might indicate that there are
strong multicollinearity or other numerical problems.
#### Hypothesis Testing:
Hypothesis Testing to Check for a Relationship Between Daily Cases and Close Price for FAANG:
A hypothesis test is conducted at the 95% confidence level to see whether there is a relationship between daily cases and the close price for FAANG.
Null Hypothesis: There is no relationship between daily cases and the close price for FAANG. Alternative Hypothesis: There is a relationship between daily cases and the close price for FAANG.
If the p-value is greater than 0.05, we fail to reject the null hypothesis.
If the p-value is smaller than 0.05, we reject the null hypothesis in favor of the alternative.
The p-value we get is 0.000, which is smaller than 0.05, so we reject the null hypothesis in favor of the alternative.
Since the null hypothesis is rejected, we can conclude that there is a relationship between daily cases and the close price of FAANG.
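A complementary check to the OLS slope test is the Pearson correlation coefficient, which reports both the strength of the linear association and its own p-value in one call. A minimal sketch with hypothetical arrays standing in for the daily cases and close-price columns:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical stand-ins for one company's daily new cases and close prices
rng = np.random.default_rng(1)
cases = rng.uniform(0, 250000, 482)
close = 450 + 0.0006 * cases + rng.normal(0, 15, 482)

# r is the correlation strength in [-1, 1]; p tests r == 0
r, p = pearsonr(cases, close)
print(f"Pearson r = {r:.3f}, p-value = {p:.3g}")
```

For a simple linear regression with one predictor, this p-value tests the same null as the slope's t-test in the OLS summaries above, so the two should agree.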
#Set Target Variable
output_var = pd.DataFrame(apple['Close/Last'])
#Selecting the Features
features = ['Open', 'High', 'Low', 'Volume']
#Scaling
scaler = StandardScaler()
feature_transform = scaler.fit_transform(apple[features])
feature_transform = pd.DataFrame(columns=features, data=feature_transform, index=apple.index)
#Splitting to Training set and Test set (the last split from TimeSeriesSplit is kept)
timesplit = TimeSeriesSplit(n_splits=10)
for train_index, test_index in timesplit.split(feature_transform):
    X_train, X_test = feature_transform[:len(train_index)], feature_transform[len(train_index):(len(train_index) + len(test_index))]
    y_train, y_test = output_var[:len(train_index)].values.ravel(), output_var[len(train_index):(len(train_index) + len(test_index))].values.ravel()
#Process the data for LSTM: reshape to (samples, timesteps, features)
trainX = np.array(X_train)
testX = np.array(X_test)
X_train = trainX.reshape(X_train.shape[0], 1, X_train.shape[1])
X_test = testX.reshape(X_test.shape[0], 1, X_test.shape[1])
#Building the LSTM Model
lstm = Sequential()
lstm.add(LSTM(32, input_shape=(1, trainX.shape[1]), activation='relu', return_sequences=False))
lstm.add(Dense(1))
lstm.compile(loss='mean_squared_error', optimizer='adam')
plot_model(lstm, show_shapes=True, show_layer_names=True)
history=lstm.fit(X_train, y_train, epochs=100, batch_size=8, verbose=1, shuffle=False)
Epoch 1/100
55/55 [==============================] - 1s 3ms/step - loss: 13683.1504
Epoch 2/100
55/55 [==============================] - 0s 2ms/step - loss: 13626.4355
...
Epoch 99/100
55/55 [==============================] - 0s 2ms/step - loss: 83.4732
Epoch 100/100
55/55 [==============================] - 0s 2ms/step - loss: 79.0087
(training log truncated: the loss falls steadily from 13683.15 to 79.01 over 100 epochs)
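Rather than scanning the log, the per-epoch losses stored by `lstm.fit` in `history.history['loss']` can be plotted as a curve. A minimal sketch, with a hypothetical short loss list standing in for the real 100-epoch history so it runs on its own:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs in a script
import matplotlib.pyplot as plt

# Hypothetical stand-in for history.history['loss'] from the fit above
loss = [13683.2, 8751.4, 2268.4, 445.8, 79.0]

plt.figure(figsize=(10, 5))
plt.plot(range(1, len(loss) + 1), loss, marker='o', label='Training Loss')
plt.title('LSTM Training Loss per Epoch')
plt.xlabel('Epoch')
plt.ylabel('MSE Loss')
plt.legend()
plt.savefig('loss_curve.png')
```

Swapping in `history.history['loss']` for the stand-in list gives the actual training curve for the model above.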
#LSTM Prediction
y_pred = lstm.predict(X_test)
plt.figure(figsize=(20, 10))
plt.plot(y_test, label='True Value')
plt.plot(y_pred, label='LSTM Value')
plt.title("Prediction by LSTM For Apple", size=24)
plt.xlabel('Time Scale', size=14)
plt.ylabel('Scaled USD', size=14)
plt.legend()
plt.show()
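Beyond eyeballing the two curves, a single number such as the root-mean-square error (RMSE) summarizes how far the predictions sit from the true prices, in the same units. A minimal sketch, with hypothetical values standing in for `y_test` and `y_pred`:

```python
import numpy as np

# Hypothetical stand-ins for the true and LSTM-predicted close prices
y_true = np.array([150.0, 152.3, 151.1, 149.8])
y_hat = np.array([149.2, 151.0, 152.0, 150.5])

# RMSE: square the errors, average, then take the root
rmse = np.sqrt(np.mean((y_true - y_hat) ** 2))
print(f"RMSE: {rmse:.2f} USD")
```

Computing this for each company's test split would make the LSTM results comparable across the five stocks despite their very different price levels (ideally after dividing by the mean price to normalize).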
#Set Target Variable
output_var = pd.DataFrame(google['Close/Last'])
#Selecting the Features
features = ['Open', 'High', 'Low', 'Volume']
#Scaling
scaler = StandardScaler()
feature_transform = scaler.fit_transform(google[features])
feature_transform = pd.DataFrame(columns=features, data=feature_transform, index=google.index)
feature_transform.head()
#Splitting to Training set and Test set
timesplit = TimeSeriesSplit(n_splits=10)
for train_index, test_index in timesplit.split(feature_transform):
    X_train, X_test = feature_transform[:len(train_index)], feature_transform[len(train_index):(len(train_index) + len(test_index))]
    y_train, y_test = output_var[:len(train_index)].values.ravel(), output_var[len(train_index):(len(train_index) + len(test_index))].values.ravel()
#Process the data for LSTM
trainX = np.array(X_train)
testX = np.array(X_test)
X_train = trainX.reshape(X_train.shape[0], 1, X_train.shape[1])
X_test = testX.reshape(X_test.shape[0], 1, X_test.shape[1])
#Building the LSTM Model
lstm = Sequential()
lstm.add(LSTM(32, input_shape=(1, trainX.shape[1]), activation='relu', return_sequences=False))
lstm.add(Dense(1))
lstm.compile(loss='mean_squared_error', optimizer='adam')
plot_model(lstm, show_shapes=True, show_layer_names=True)
history = lstm.fit(X_train, y_train, epochs=100, batch_size=8, verbose=1, shuffle=False)
#LSTM Prediction
y_pred = lstm.predict(X_test)
Epoch 1/100
55/55 [==============================] - 1s 2ms/step - loss: 3818536.2500
Epoch 2/100
55/55 [==============================] - 0s 2ms/step - loss: 3817693.2500
...
Epoch 99/100
55/55 [==============================] - 0s 2ms/step - loss: 1678224.1250
Epoch 100/100
55/55 [==============================] - 0s 3ms/step - loss: 1659123.6250
(training log truncated: the loss falls steadily from 3818536.25 to 1659123.63 over 100 epochs)
plt.figure(figsize=(20, 10))
plt.plot(y_test, label='True Value')
plt.plot(y_pred, label='LSTM Value')
plt.title("Prediction by LSTM For Google", size = 24)
plt.xlabel('Time Scale', size = 14)
plt.ylabel('Scaled USD', size = 14)
plt.legend()
plt.show()
#Set Target Variable
output_var = pd.DataFrame(amazon['Close/Last'])
#Selecting the Features
features = ['Open', 'High', 'Low', 'Volume']
#Scaling
scaler = StandardScaler()
feature_transform = scaler.fit_transform(amazon[features])
feature_transform = pd.DataFrame(columns=features, data=feature_transform, index=amazon.index)
feature_transform.head()
#Splitting to Training set and Test set
#(only the final split is kept, giving the largest training window)
timesplit = TimeSeriesSplit(n_splits=10)
for train_index, test_index in timesplit.split(feature_transform):
    X_train, X_test = feature_transform[:len(train_index)], feature_transform[len(train_index):(len(train_index) + len(test_index))]
    y_train, y_test = output_var[:len(train_index)].values.ravel(), output_var[len(train_index):(len(train_index) + len(test_index))].values.ravel()
#Process the data for LSTM: reshape to (samples, timesteps, features)
trainX = np.array(X_train)
testX = np.array(X_test)
X_train = trainX.reshape(X_train.shape[0], 1, X_train.shape[1])
X_test = testX.reshape(X_test.shape[0], 1, X_test.shape[1])
#Building the LSTM Model
lstm = Sequential()
lstm.add(LSTM(32, input_shape=(1, trainX.shape[1]), activation='relu', return_sequences=False))
lstm.add(Dense(1))
lstm.compile(loss='mean_squared_error', optimizer='adam')
plot_model(lstm, show_shapes=True, show_layer_names=True)
history = lstm.fit(X_train, y_train, epochs=100, batch_size=8, verbose=1, shuffle=False)
#LSTM Prediction
y_pred = lstm.predict(X_test)
[plot_model skipped: pydot/graphviz are not installed, so no model diagram is rendered. Training log truncated: 55 batches/epoch; loss decreases steadily from ~9,199,688 at epoch 1 to ~4,921,747 at epoch 100.]
plt.figure(figsize=(20, 10))
plt.plot(y_test, label='True Value')
plt.plot(y_pred, label='LSTM Value')
plt.title("Prediction by LSTM For Amazon", size = 24)
plt.xlabel('Time Scale', size = 14)
plt.ylabel('Scaled USD', size = 14)
plt.legend()
plt.show()
#Set Target Variable
output_var = pd.DataFrame(facebook['Close/Last'])
#Selecting the Features
features = ['Open', 'High', 'Low', 'Volume']
#Scaling
scaler = StandardScaler()
feature_transform = scaler.fit_transform(facebook[features])
feature_transform = pd.DataFrame(columns=features, data=feature_transform, index=facebook.index)
feature_transform.head()
#Splitting to Training set and Test set
#(only the final split is kept, giving the largest training window)
timesplit = TimeSeriesSplit(n_splits=10)
for train_index, test_index in timesplit.split(feature_transform):
    X_train, X_test = feature_transform[:len(train_index)], feature_transform[len(train_index):(len(train_index) + len(test_index))]
    y_train, y_test = output_var[:len(train_index)].values.ravel(), output_var[len(train_index):(len(train_index) + len(test_index))].values.ravel()
#Process the data for LSTM: reshape to (samples, timesteps, features)
trainX = np.array(X_train)
testX = np.array(X_test)
X_train = trainX.reshape(X_train.shape[0], 1, X_train.shape[1])
X_test = testX.reshape(X_test.shape[0], 1, X_test.shape[1])
#Building the LSTM Model
lstm = Sequential()
lstm.add(LSTM(32, input_shape=(1, trainX.shape[1]), activation='relu', return_sequences=False))
lstm.add(Dense(1))
lstm.compile(loss='mean_squared_error', optimizer='adam')
plot_model(lstm, show_shapes=True, show_layer_names=True)
history = lstm.fit(X_train, y_train, epochs=100, batch_size=8, verbose=1, shuffle=False)
#LSTM Prediction
y_pred = lstm.predict(X_test)
[plot_model skipped: pydot/graphviz are not installed, so no model diagram is rendered. Training log truncated: 55 batches/epoch; loss decreases steadily from ~77,644 at epoch 1 to ~1,111 at epoch 100.]
plt.figure(figsize=(20, 10))
plt.plot(y_test, label='True Value')
plt.plot(y_pred, label='LSTM Value')
plt.title("Prediction by LSTM For Facebook",size = 24)
plt.xlabel('Time Scale',size = 14)
plt.ylabel('Scaled USD', size = 14)
plt.legend()
plt.show()
#Set Target Variable
output_var = pd.DataFrame(netflix['Close/Last'])
#Selecting the Features
features = ['Open', 'High', 'Low', 'Volume']
#Scaling
scaler = StandardScaler()
feature_transform = scaler.fit_transform(netflix[features])
feature_transform = pd.DataFrame(columns=features, data=feature_transform, index=netflix.index)
feature_transform.head()
#Splitting to Training set and Test set
#(only the final split is kept, giving the largest training window)
timesplit = TimeSeriesSplit(n_splits=10)
for train_index, test_index in timesplit.split(feature_transform):
    X_train, X_test = feature_transform[:len(train_index)], feature_transform[len(train_index):(len(train_index) + len(test_index))]
    y_train, y_test = output_var[:len(train_index)].values.ravel(), output_var[len(train_index):(len(train_index) + len(test_index))].values.ravel()
#Process the data for LSTM: reshape to (samples, timesteps, features)
trainX = np.array(X_train)
testX = np.array(X_test)
X_train = trainX.reshape(X_train.shape[0], 1, X_train.shape[1])
X_test = testX.reshape(X_test.shape[0], 1, X_test.shape[1])
#Building the LSTM Model
lstm = Sequential()
lstm.add(LSTM(32, input_shape=(1, trainX.shape[1]), activation='relu', return_sequences=False))
lstm.add(Dense(1))
lstm.compile(loss='mean_squared_error', optimizer='adam')
plot_model(lstm, show_shapes=True, show_layer_names=True)
history = lstm.fit(X_train, y_train, epochs=100, batch_size=8, verbose=1, shuffle=False)
#LSTM Prediction
y_pred = lstm.predict(X_test)
plt.figure(figsize=(20, 10))
plt.plot(y_test, label='True Value')
plt.plot(y_pred, label='LSTM Value')
plt.title("Prediction by LSTM For Netflix", size = 24)
plt.xlabel('Time Scale', size = 14)
plt.ylabel('Scaled USD', size = 14)
plt.legend()
plt.show()
We can tell from the plots that the LSTM tracks the stock prices reasonably well. An LSTM module has a gating mechanism (input, forget, and output gates) that allows it to model both long-term and short-term dependencies in the data.
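One caveat worth noting about the training runs above: the loss values are huge because the target (`Close/Last`) is fit in raw dollars while the features are standardized. A minimal, self-contained sketch of the fix, scaling the target with its own scaler and inverting after prediction (the `prices` series below is synthetic stand-in data, not the real Amazon prices):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a Close/Last series (hypothetical data).
rng = np.random.default_rng(0)
prices = 1500 + np.cumsum(rng.normal(0, 10, size=200))

# Fit a separate scaler on the target so MSE losses stay in a readable range.
y_scaler = StandardScaler()
y_scaled = y_scaler.fit_transform(prices.reshape(-1, 1))

# After predicting in scaled space, invert back to dollars.
y_pred_scaled = y_scaled[:10]                      # pretend model outputs
y_pred_usd = y_scaler.inverse_transform(y_pred_scaled)

print(np.allclose(y_pred_usd.ravel(), prices[:10]))
```

With this change the 'Scaled USD' axis label on the plots would also be accurate, and the loss curves become comparable across the five tickers.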
#Predict the close price p_days ahead with three classical regressors, one figure per company
for c, i in zip(new, comp):
    df = c[['Close/Last']].copy(deep=True)
    p_days = 200
    df['Prediction'] = df[['Close/Last']].shift(-p_days)
    X = np.array(df.drop(columns=['Prediction']))[:-p_days]
    y = np.array(df['Prediction'])[:-p_days]
    x_train, x_test, y_train, y_test = train_test_split(X, y, test_size=0.25)
    lr = LinearRegression().fit(x_train, y_train)
    knn = KNeighborsRegressor().fit(x_train, y_train)
    tree = DecisionTreeRegressor().fit(x_train, y_train)
    #The last p_days rows of known prices are the inputs we predict from
    x_future = df.drop(columns=['Prediction'])[:-p_days]
    x_future = np.array(x_future.tail(p_days))
    tree_prediction = tree.predict(x_future)
    lr_prediction = lr.predict(x_future)
    knn_prediction = knn.predict(x_future)
    #Plot the actual series plus each model's predictions over the validation window
    fig = go.Figure()
    fig.add_trace(go.Scatter(x=df.index.values, y=df['Close/Last'], name='Actual Close', line=dict(width=1.5)))
    valid = df[X.shape[0]:].copy(deep=True)
    valid['Predictions'] = tree_prediction
    fig.add_trace(go.Scatter(x=valid.index.values, y=valid['Predictions'], name='Predicted D-Tree', line=dict(width=1.5)))
    valid['Predictions'] = knn_prediction
    fig.add_trace(go.Scatter(x=valid.index.values, y=valid['Predictions'], name='Predicted k-NN', marker_color='gold', line=dict(width=1.5)))
    valid['Predictions'] = lr_prediction
    fig.add_trace(go.Scatter(x=valid.index.values, y=valid['Predictions'], name='Predicted Li Reg', line=dict(width=1.5)))
    #Dropdown menu to toggle models (trace order: Actual, D-Tree, k-NN, Li Reg)
    fig.update_layout(
        updatemenus=[dict(
            buttons=list([
                dict(label='All', method='update',
                     args=[{'visible': [True, True, True, True]}, {'title': 'Prediction Overview', 'showlegend': True}]),
                dict(label='Decision Tree Prediction', method='update',
                     args=[{'visible': [True, True, False, False]}, {'title': 'Decision Tree Regressor for ' + i, 'showlegend': True}]),
                dict(label='k-NN Regressor Prediction', method='update',
                     args=[{'visible': [True, False, True, False]}, {'title': 'k-NN Regressor for ' + i, 'showlegend': True}]),
                dict(label='Linear Regression Prediction', method='update',
                     args=[{'visible': [True, False, False, True]}, {'title': 'Linear Regression for ' + i, 'showlegend': True}]),
            ]),
            direction="down", pad={"r": 10, "t": 10}, showactive=True,
            x=0.1, xanchor="left", y=1.1, yanchor="top")])
    fig.update_layout(title='Prediction of Close Price for ' + i, xaxis_title='Date', yaxis_title='Close Price')
    fig.update_layout(autosize=False, width=1000, height=650)
    iplot(fig, show_link=False)
The inference I draw from these graphs is that the k-NN Regressor gives among the most accurate predictions, judging by its accuracy scores. There are a few outliers in the graphs that could shift the predictions, but they turn out to be insignificant and can always change depending on the trading model a company adopts.
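The accuracy claim above can be made concrete with held-out scores. A minimal sketch of comparing the three regressors with R² (using a synthetic next-day-close setup, since the notebook's scores are not printed in this section; variable names are illustrative):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic close-price random walk standing in for one ticker (hypothetical data).
rng = np.random.default_rng(42)
close = 300 + np.cumsum(rng.normal(0, 3, size=500))
X = close[:-1].reshape(-1, 1)   # today's close
y = close[1:]                   # next day's close

x_train, x_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    'Linear Regression': LinearRegression(),
    'k-NN Regressor': KNeighborsRegressor(),
    'Decision Tree': DecisionTreeRegressor(random_state=0),
}
for name, model in models.items():
    model.fit(x_train, y_train)
    print(f"{name}: R^2 = {r2_score(y_test, model.predict(x_test)):.3f}")
```

The same three lines of scoring could be dropped into the loop above (scoring on `x_test`/`y_test` per company) to report the scores alongside each figure.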
The aim of this project was to evaluate whether COVID-19 cases and deaths explain and predict FAANG stock prices during the COVID-19 period. I find that both COVID-19 cases and COVID-19-related deaths have contemporaneous relationships with, and predictive power over, abnormal stock prices. These shocks affect investment decisions and the subsequent stock price dynamics.